Conjugate gradient acceleration of iteratively re-weighted least squares methods

Authors

  • Massimo Fornasier
  • Steffen Peter
  • Holger Rauhut
  • Stephan Worm
Abstract

Iteratively Re-weighted Least Squares (IRLS) is a method for solving minimization problems involving non-quadratic cost functions, perhaps non-convex and non-smooth, which however can be described as the infimum over a family of quadratic functions. This characterization suggests an algorithmic scheme that solves a sequence of quadratic problems, each of which can be tackled efficiently by tools of numerical linear algebra. Its general scope and its usually simple implementation, transforming the initial non-convex and non-smooth minimization problem into a more familiar and easily solvable quadratic optimization problem, make it a versatile algorithm. It has been formulated for a variety of problems, such as robust statistical linear regression, total variation minimization in image processing, the so-called Kačanov fixed-point iteration for the solution of certain quasi-linear elliptic partial differential equations, ℓτ-norm minimization for 0 < τ ≤ 1 in signal processing, and nuclear norm minimization for low-rank matrix identification. However, despite its simplicity, versatility, and elegant analysis, the complexity of IRLS strongly depends on the way the solution of the successive quadratic optimizations is addressed. For the important special case of compressed sensing and sparse recovery problems in signal processing, we investigate theoretically and numerically how accurately one needs to solve the quadratic problems by means of the conjugate gradient (CG) method in each iteration in order to guarantee convergence. The use of the CG method may significantly speed up the numerical solution of the quadratic subproblems, in particular when fast matrix-vector multiplication (exploiting for instance the FFT) is available for the matrix involved. In addition, we study convergence rates. Our modified IRLS method outperforms state-of-the-art first-order methods such as Iterative Hard Thresholding (IHT) or the Fast Iterative Soft-Thresholding Algorithm (FISTA) in many situations, especially in large dimensions. Moreover, IRLS is often able to recover sparse vectors from fewer measurements than required for IHT and FISTA.
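To make the scheme concrete, below is a minimal sketch of CG-accelerated IRLS for basis pursuit (min ‖x‖₁ subject to Ax = y), the τ = 1 instance discussed in the abstract. The ε-update follows the classical rule of Daubechies, DeVore, Fornasier, and Güntürk; the function names, fixed tolerances, and iteration counts are illustrative assumptions, not the paper's adaptive choices.

```python
import numpy as np

def cg_solve(matvec, b, tol=1e-8, maxiter=200):
    """Plain conjugate gradients for the SPD system S z = b,
    where S is available only through its matrix-vector product."""
    z = np.zeros_like(b)
    r = b.copy()
    p = r.copy()
    rs = r @ r
    bnorm = np.linalg.norm(b)
    for _ in range(maxiter):
        Sp = matvec(p)
        alpha = rs / (p @ Sp)
        z += alpha * p
        r -= alpha * Sp
        rs_new = r @ r
        if np.sqrt(rs_new) <= tol * bnorm:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return z

def irls_cg(A, y, sparsity_k, n_iter=50, cg_tol=1e-8):
    """Sketch of CG-accelerated IRLS for min ||x||_1 s.t. Ax = y."""
    m, N = A.shape
    # Least-squares starting point x = A^T (A A^T)^{-1} y, itself via CG
    # (assumes A has full row rank, as usual in compressed sensing).
    x = A.T @ cg_solve(lambda v: A @ (A.T @ v), y, cg_tol)
    eps = 1.0
    for _ in range(n_iter):
        if eps == 0.0:
            break
        # Inverse weights d_i = sqrt(x_i^2 + eps^2): the smoothed |x_i|
        # arises from writing |t| as an infimum over quadratics in t.
        d = np.sqrt(x**2 + eps**2)
        # Weighted least-squares subproblem: x = D A^T z with
        # (A D A^T) z = y, solved only approximately by CG --
        # this inexactness is what the paper analyzes.
        z = cg_solve(lambda v: A @ (d * (A.T @ v)), y, cg_tol)
        x = d * (A.T @ z)
        # Shrink the smoothing parameter using the (k+1)-st largest
        # magnitude of the current iterate (rule of Daubechies et al.).
        eps = min(eps, np.sort(np.abs(x))[::-1][sparsity_k] / N)
    return x
```

In practice the paper couples the CG accuracy to the outer IRLS progress; the fixed cg_tol above is the simplest stand-in. Fast matvecs (e.g., an FFT-based A) drop in by replacing the two lambdas.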


Similar Articles

Fast Logistic Regression for Data Mining, Text Classification and Link Detection

Previous work by the authors [1] demonstrated that logistic regression can be a fast and accurate data mining tool for life sciences datasets, competitive with modern tools like support vector machines and ball-tree-based K-NN. This paper has two objectives. The first objective is a serious empirical comparison of logistic regression to several classical and modern learners on a variety of learn...


Sparse Radon transform with dual gradient ascent method

The Radon transform suffers from the typical problems of loss of resolution and aliasing that arise as a consequence of incomplete information, such as limited aperture and discretization. Sparseness in the Radon domain, which is equivalent to assuming smooth amplitude variation in the transition between known and unknown (missing) data, is valid and useful prior information (Trad et al., 2003). ...


On the Properties of Preconditioners for Robust Linear Regression

In this paper, we consider solving the robust linear regression problem y = Ax + ε by Newton's method and the iteratively reweighted least squares method. We show that each of these methods can be combined with the preconditioned conjugate gradient least squares algorithm to solve large, sparse, rectangular systems of linear algebraic equations efficiently. We consider the constant preconditioner AᵀA ...


Model Selection for Kernel Probit Regression

The convex optimisation problem involved in fitting a kernel probit regression (KPR) model can be solved efficiently via an iteratively re-weighted least-squares (IRWLS) approach. The use of successive quadratic approximations of the true objective function suggests an efficient approximate form of leave-one-out cross-validation for KPR, based on an existing exact algorithm for the weighted lea...


Properties of Preconditioners for Robust Linear Regression

In this paper, we consider solving the robust linear regression problem by an inexact Newton method and an iteratively reweighted least squares method. We show that each of these methods can be combined with the preconditioned conjugate gradient least squares algorithm to solve large, sparse systems of linear equations efficiently. We consider the constant preconditioner and preconditioners base...
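The two preconditioner papers above pair IRLS-type iterations with preconditioned CG on the weighted normal equations, reusing a constant preconditioner built from AᵀA. Below is a minimal sketch of that combination, assuming a Huber loss with its standard tuning constant 1.345 and a dense Cholesky factor as the preconditioner application; the papers' exact preconditioners, inexactness criteria, and sparse factorizations are not reproduced here.

```python
import numpy as np
from scipy.linalg import cho_factor, cho_solve

def pcg(matvec, b, apply_Minv, tol=1e-10, maxiter=200):
    """Preconditioned conjugate gradients for an SPD system."""
    x = np.zeros_like(b)
    r = b.copy()
    z = apply_Minv(r)
    p = z.copy()
    rz = r @ z
    bnorm = np.linalg.norm(b)
    for _ in range(maxiter):
        Sp = matvec(p)
        alpha = rz / (p @ Sp)
        x += alpha * p
        r -= alpha * Sp
        if np.linalg.norm(r) <= tol * bnorm:
            break
        z = apply_Minv(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

def irwls_huber_pcg(A, y, delta=1.345, n_iter=30):
    """Sketch: Huber-robust regression by IRWLS; each weighted normal
    system A^T W A x = A^T W y is solved by PCG with the *constant*
    preconditioner A^T A, factored once and reused every iteration."""
    chol = cho_factor(A.T @ A)       # one-time factorization
    x = cho_solve(chol, A.T @ y)     # ordinary least-squares start
    for _ in range(n_iter):
        res = y - A @ x
        # Huber weights psi(r)/r: 1 on small residuals, damped on outliers.
        w = np.where(np.abs(res) <= delta,
                     1.0, delta / np.maximum(np.abs(res), 1e-12))
        x = pcg(lambda v: A.T @ (w * (A @ v)),
                A.T @ (w * y),
                lambda r_: cho_solve(chol, r_))
    return x
```

The design point both papers exploit is that the weights change every IRWLS sweep but the preconditioner does not, so the cost of forming and factoring it is amortized across all inner PCG solves.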



Journal:
  • Comp. Opt. and Appl.

Volume 65, Issue -

Pages -

Publication date: 2016